Improved capsule network based on multipath feature
Qinghai XU, Shifei DING, Tongfeng SUN, Jian ZHANG, Lili GUO
Journal of Computer Applications    2023, 43 (5): 1330-1335.   DOI: 10.11772/j.issn.1001-9081.2022030367
To address the poor classification performance of Capsule Network (CapsNet) on complex datasets and the large number of parameters introduced by its routing process, a Capsule Network based on Multipath features (MCNet) was proposed, including a novel capsule feature extractor and a novel capsule pooling method. The capsule feature extractor extracted features from different layers and locations in parallel along multiple paths, and then encoded them into capsule features containing richer semantic information. The capsule pooling method selected the most active capsule at each position of the capsule feature map, so that the effective capsule features were represented by a small number of capsules. Comparisons with CapsNet and other models were performed on four datasets (CIFAR-10, SVHN, Fashion-MNIST, MNIST). Experimental results show that MCNet achieves a classification accuracy of 79.27% on the CIFAR-10 dataset with 6.25×10⁶ trainable parameters; compared with CapsNet, its classification accuracy is improved by 8.7% while the number of parameters is reduced by 46.8%. MCNet can effectively improve classification accuracy while reducing the number of trainable parameters.
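The capsule-pooling step described above can be sketched in NumPy: at each spatial position of the capsule feature map, only the most active capsule is kept. The tensor layout, the L2-norm activity measure, and the absence of a pooling window are illustrative assumptions here, not details taken from the paper.

```python
import numpy as np

def capsule_pool(caps):
    """At each spatial position, keep only the most active capsule,
    where activity is measured by the capsule vector's L2 norm.
    caps: array of shape (H, W, n_caps, caps_dim)."""
    norms = np.linalg.norm(caps, axis=-1)      # (H, W, n_caps)
    best = norms.argmax(axis=-1)               # index of strongest capsule
    h_idx, w_idx = np.indices(best.shape)
    return caps[h_idx, w_idx, best]            # (H, W, caps_dim)
```

The output represents each position by a single capsule vector, which is how a small number of capsules can stand in for the full feature map.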
Time series clustering based on new robust similarity measure
LI Guorong, YE Jimin, ZHEN Yuanting
Journal of Computer Applications    2021, 41 (5): 1343-1347.   DOI: 10.11772/j.issn.1001-9081.2020071142
For time series data with outliers, a Robust Generalized Cross-Correlation measure (RGCC) between time series, based on a robust estimate of the correlation coefficient, was proposed. First, a robust correlation coefficient was introduced in place of the Pearson correlation coefficient to calculate the covariance matrix of the time series data. Second, the determinant of the new covariance matrix was used to construct a similarity measure between two time series, named RGCC. Finally, the distance matrix between the time series was calculated based on this measure and used as the input of a clustering algorithm. Simulation experiments on time series clustering show that, for data with outliers, the clustering results based on RGCC are noticeably closer to the true clustering than those based on the original Generalized Cross-Correlation measure (GCC). The proposed robust similarity measure is therefore well suited to time series data with outliers.
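The determinant-based construction can be illustrated with a minimal sketch. Here a rank (Spearman-style) correlation stands in for the paper's robust correlation estimator, which is an assumption for illustration; the point is that the determinant of the 2×2 correlation matrix, 1 − r², is near 0 for strongly related series and near 1 for unrelated ones, and a rank-based r is insensitive to a single outlier.

```python
import numpy as np

def robust_corr(x, y):
    """Rank correlation as a simple robust stand-in for the robust
    correlation estimator used in the paper (an assumption here)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean(); ry -= ry.mean()
    return float(rx @ ry / np.sqrt((rx @ rx) * (ry @ ry)))

def rgcc_distance(x, y):
    """Determinant-based dissimilarity: det of the 2x2 robust
    correlation matrix is 1 - r^2, i.e. 0 for perfectly correlated
    series and 1 for uncorrelated ones."""
    r = robust_corr(x, y)
    R = np.array([[1.0, r], [r, 1.0]])
    return float(np.linalg.det(R))
```

Feeding the pairwise `rgcc_distance` matrix to any distance-based clustering algorithm reproduces the pipeline the abstract describes.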
Construction and correlation analysis of national food safety standard graph
QIN Li, HAO Zhigang, LI Guoliang
Journal of Computer Applications    2021, 41 (4): 1005-1011.   DOI: 10.11772/j.issn.1001-9081.2020081311
National Food Safety Standards (NFSS) are not only the operating specifications of food producers but also the law-enforcement criteria for food safety supervision. However, NFSSs are numerous, wide-ranging in content, and linked by complicated inter-reference relationships. To study the contents and structures of NFSSs systematically, it is necessary to extract the knowledge in them and mine their reference relationships. First, the contents of the standard files and the reference relationships between them were extracted as knowledge triplets using Knowledge Graph (KG) technology, and the triplets were used to construct an NFSS knowledge graph. Then, this knowledge graph was linked to a food-production-process ontology built manually from Hazard Analysis and Critical Control Point (HACCP) standards, so that food safety standards and the related food production processes could be cross-referenced. At the same time, the Louvain community discovery algorithm was used to analyze the standard reference network in the knowledge graph, identifying the most-cited standards in NFSSs and their types. Finally, a question answering system was built with gStore's Application Programming Interface (API) and Django, which realized knowledge retrieval and reasoning based on natural language and allowed high-impact NFSSs in the graph to be found under specified requirements.
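One part of the analysis above, finding the most-cited standards in the reference network, can be sketched directly on the knowledge triplets. The standard identifiers and the predicate name `"references"` below are invented for illustration; the paper's actual schema is not specified here.

```python
from collections import Counter

def top_cited(triples, k=2):
    """Count how often each standard appears as the target of a
    'references' triple and return the k most-cited standards.
    Triples are (subject, predicate, object) tuples."""
    cites = Counter(o for s, p, o in triples if p == "references")
    return cites.most_common(k)
```

In the real system this in-degree counting would run over the citation edges stored in the graph database, alongside the Louvain community analysis.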
Double ring mapping projection for panoramic video
LIN Chang, LI Guoping, ZHAO Haiwu, WANG Guozhong, GU Xiao
Journal of Computer Applications    2017, 37 (9): 2631-2635.   DOI: 10.11772/j.issn.1001-9081.2017.09.2631
To solve the problems of excessive deformation in some areas and a large volume of redundant data in panoramic video mapping, a Double-Ring mapping Projection (DRP) algorithm was proposed. Firstly, according to the geometric characteristics of spherical video and the visual characteristics of the Human Visual System (HVS), the spherical video was divided into 14 equal-sized regions by two mutually orthogonal ring regions. Then, according to the spatial-domain sampling theorem, the 14 regions of spherical video content were mapped to 14 equally sized rectangular videos by the Lanczos interpolation method. Finally, according to the characteristics of the latest video coding standards, the 14 rectangular videos were rearranged into a compact panoramic video that conforms to those standards. The experimental results show that the DRP algorithm achieves higher compression efficiency than the EquiRectangular mapping Projection (ERP), OctaHedral mapping Projection (OHP), and IcoSahedral mapping Projection (ISP) algorithms. Specifically, compared with the widely used ERP algorithm, the proposed method reduces the bit rate by 8.61%, clearly improving coding efficiency.
Sparse signal reconstruction optimization algorithm based on recurrent neural network
WANG Xingxing, LI Guocheng
Journal of Computer Applications    2017, 37 (9): 2590-2594.   DOI: 10.11772/j.issn.1001-9081.2017.09.2590
Aiming at the problem of sparse signal reconstruction, an optimization algorithm based on a Recurrent Neural Network (RNN) was proposed. Firstly, the sparsity of the signal was represented and the mathematical model was transformed into an optimization problem. Then, since the l0-norm is a non-convex, non-differentiable function and the resulting optimization problem is NP-hard, an equivalent optimization problem was formulated under the premise that the measurement matrix A satisfies the Restricted Isometry Property (RIP). Finally, a corresponding Hopfield RNN model was established to solve the equivalent problem and thereby reconstruct sparse signals. The experimental results show that, for different numbers of observations m, the RNN algorithm attains a smaller relative error with fewer observations than the other three algorithms compared, and can reconstruct sparse signals efficiently.
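The underlying task, recovering a k-sparse x from y = Ax with far fewer measurements than unknowns, can be illustrated with a classic greedy baseline. This is Orthogonal Matching Pursuit, not the paper's Hopfield RNN or its equivalent convex problem; it is included only as a self-contained sketch of sparse reconstruction, and the problem sizes below are arbitrary.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedily add the column of A most
    correlated with the current residual, then re-fit the coefficients
    on the chosen support by least squares."""
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x
```

With a random Gaussian measurement matrix (which satisfies RIP with high probability), a 3-sparse signal in 80 dimensions is typically recovered exactly from 40 measurements.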
Improved algorithm of artificial bee colony based on Spark
ZHAI Guangming, LI Guohe, WU Weijiang, HONG Yunfeng, ZHOU Xiaoming, WANG Jing
Journal of Computer Applications    2017, 37 (7): 1906-1910.   DOI: 10.11772/j.issn.1001-9081.2017.07.1906
To combat the low efficiency of the Artificial Bee Colony (ABC) algorithm in solving combinatorial problems, a parallel ABC optimization algorithm based on Spark was presented. Firstly, the bee colony was divided into subgroups, among which broadcast variables were used to transmit data, and the colony was constructed as a Resilient Distributed Dataset (RDD). Secondly, a series of transformation operators were used to parallelize the solution search. Finally, a gravitational-mass calculation was used to replace roulette-wheel probability selection, reducing the time complexity. Simulation results on the Traveling Salesman Problem (TSP) prove the feasibility of the proposed parallel algorithm. The experimental results show that the proposed algorithm achieves a 3.24× speedup over the standard ABC algorithm, and its convergence speed is about 10% higher than that of the unimproved parallel ABC algorithm; it has significant advantages in solving high-dimensional problems.
Scale adaptive improvement of kernel correlation filter tracking algorithm
QIAN Tanghui, LUO Zhiqing, LI Guojia, LI Yingyun, LI Xiankai
Journal of Computer Applications    2017, 37 (3): 811-816.   DOI: 10.11772/j.issn.1001-9081.2017.03.811
To solve the problem that the Circulant Structure of tracking-by-detection with Kernels (CSK) algorithm has difficulty adapting to target scale changes, a multi-scale kernel correlation filter classifier was proposed to realize scale-adaptive target tracking. Firstly, multi-scale images were used to construct the sample set, and the multi-scale kernel correlation filter classifier was trained on this set to estimate the target size and detect the optimal scale. Then, samples collected at the optimal target scale were used to update the classifier online, achieving scale-adaptive target tracking. Comparative experiments and analysis show that the proposed algorithm can adapt to scale changes of the target during tracking, and its eccentricity error is reduced to between 1/5 and 1/3 of that of the CSK algorithm, which meets the needs of long-term tracking in complex scenes.
User discovery based on loyalty in social networks
XUE Yun, LI Guohe, WU Weijiang, HONG Yunfeng, ZHOU Xiaoming
Journal of Computer Applications    2017, 37 (11): 3095-3100.   DOI: 10.11772/j.issn.1001-9081.2017.11.3095
To improve user stickiness in social networks, an algorithm based on user loyalty in social network systems was proposed. In the algorithm, a double Recency-Frequency-Monetary (RFM) model was used to mine users with different kinds of loyalty. Firstly, according to the double RFM model, users' consumption value and behavior value were calculated dynamically to obtain their loyalty over a given period. Secondly, typical loyal and disloyal users were identified using a constructed standard curve and similarity calculation. Lastly, potential loyal and disloyal users were found using modularity-based community discovery and the independent cascade propagation model. On microblog datasets from a social network, a quantitative representation of user loyalty in a Social Network Service (SNS) was confirmed, allowing users to be distinguished by their loyalty. The experimental results show that the proposed algorithm can effectively identify users of different loyalty types and can be applied to personalized recommendation, marketing, and other tasks in social network systems.
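The RFM idea behind the model can be sketched as a simple weighted score. The weights, normalization caps, and the single combined score below are illustrative assumptions; the paper's double RFM model calculates separate consumption and behavior values dynamically rather than this toy formula.

```python
def rfm_score(recency_days, frequency, monetary,
              w=(0.5, 0.3, 0.2), max_days=30.0):
    """Toy RFM loyalty score in [0, 1]: recent, frequent, high-spending
    users score higher. All constants are illustrative assumptions."""
    r = max(0.0, 1.0 - recency_days / max_days)  # fresher activity -> higher
    f = min(1.0, frequency / 10.0)               # cap frequency contribution
    m = min(1.0, monetary / 100.0)               # cap spending contribution
    return w[0] * r + w[1] * f + w[2] * m
```

Users whose score tracks the "loyal" standard curve over time would then be the typical loyal users the abstract refers to.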
Improved algorithm for sample adaptive offset filter based on AVS2
CHEN Zhixian, WANG Guozhong, ZHAO Haiwu, LI Guoping, TENG Guowei
Journal of Computer Applications    2016, 36 (5): 1362-1365.   DOI: 10.11772/j.issn.1001-9081.2016.05.1362
Sample Adaptive Offset (SAO) is a time-consuming part of the in-loop filter in the second-generation Audio Video coding Standard (AVS2) and the High Efficiency Video Coding (HEVC) standard. Aiming at the large amount of computation and high complexity of existing SAO algorithms, an improved fast rate-distortion algorithm was proposed. In this method, the originally defined table of offset values and the binary bit string written into the code stream were modified by analyzing the relationship between the different offset values of each class in edge mode and the resulting change in rate-distortion, so that an early-termination condition could be set to quickly find the best offset value for the current SAO unit without calculating the rate-distortion cost of every offset. The experimental results show that, compared with the reference implementation in AVS2, the proposed algorithm reduces the number of cycles needed to find the best offset values by 75% and the running time of the in-loop filter by 33%, effectively lowering the computational complexity while leaving the image's rate-distortion performance almost unchanged.
Surface reconstruction for scattered point clouds with adaptive α-shape
HE Hua, LI Zongchun, LI Guojun, RUAN Huanli, LONG Changyu
Journal of Computer Applications    2016, 36 (12): 3394-3397.   DOI: 10.11772/j.issn.1001-9081.2016.12.3394
The α-shape algorithm is not suitable for surface reconstruction from scattered, non-uniformly sampled points. To solve this problem, an improved adaptive α-shape surface reconstruction algorithm based on the Local Feature Size (LFS) of the point cloud was proposed. Firstly, the Medial Axis (MA) of the surface was approximated by the negative poles computed from the k-nearest neighbors of the sampled points. Secondly, the LFS of the sampled points was calculated from the approximated MA, and the original point cloud was non-uniformly simplified based on LFS. Finally, the surface was adaptively reconstructed based on the circumscribed-ball radius of each triangle and the corresponding α value. In comparison experiments with the α-shape algorithm, the proposed algorithm effectively and reasonably reduced the number of points, achieving a simplification rate of about 70%, and produced reconstruction results with fewer redundant triangles and fewer holes. The experimental results show that the proposed algorithm can adaptively reconstruct surfaces from non-uniformly sampled point clouds.
Wolf pack algorithm based on modified search strategy
LI Guoliang, WEI Zhenhua, XU Lei
Journal of Computer Applications    2015, 35 (6): 1633-1636.   DOI: 10.11772/j.issn.1001-9081.2015.06.1633

Aiming at the shortcomings of the Wolf Pack Algorithm (WPA), such as slow convergence, a tendency to fall into local optima, and unsatisfactory interaction between artificial wolves, a wolf pack algorithm based on a modified search strategy, named the Modified Wolf Pack Algorithm (MWPA), was proposed. To promote the exchange of information between artificial wolves, improve the pack's grasp of global information, and enhance its exploration ability, an interactive strategy was introduced into the scouting and summoning behaviors. An adaptive beleaguering strategy was proposed for the beleaguering behavior, giving the algorithm a self-regulating role: as the algorithm evolves, the beleaguered range of the wolves shrinks and the exploitation ability strengthens, increasing the convergence rate. Simulation results on six typical complex function optimization problems show that, compared with the Wolf Colony search Algorithm based on the strategy of the Leader (LWCA), the proposed method obtains higher solution accuracy and faster convergence, and is especially suitable for function optimization problems.
Metadata management mechanism of massive spatial data storage
YANG Wenhui, LI Guoqiang, MIAO Fang
Journal of Computer Applications    2015, 35 (5): 1276-1279.   DOI: 10.11772/j.issn.1001-9081.2015.05.1276

To manage the metadata of massive spatial data storage effectively, a distributed metadata-server management structure based on consistent hashing was introduced. On this basis, a wheeled metadata backup strategy was proposed, in which metadata were placed on hash-ring nodes by the consistent hashing algorithm and backed up according to the data-backup method; this effectively alleviated the single-point and access-bottleneck problems of metadata management. Finally, the wheeled backup strategy was tested to obtain the optimal number of metadata backup nodes. Compared with a single metadata server, the proposed strategy improves metadata safety and reduces access delay, and, combined with virtual nodes, improves the load balance of the distributed metadata servers.
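The structure the strategy builds on, a consistent-hash ring with virtual nodes that maps metadata keys to servers and to backup servers further around the ring, can be sketched as follows. The hash function, virtual-node count, and server names are illustrative assumptions, not the paper's configuration.

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Minimal consistent-hash ring with virtual nodes for mapping
    metadata keys to metadata servers (a sketch of the structure the
    paper builds on; replica placement details are assumptions)."""

    def __init__(self, servers, vnodes=100):
        self.ring = []                    # sorted (hash, server) points
        for s in servers:
            for i in range(vnodes):       # virtual nodes smooth the load
                self.ring.append((self._h(f"{s}#{i}"), s))
        self.ring.sort()
        self.keys = [h for h, _ in self.ring]

    @staticmethod
    def _h(s):
        return int(hashlib.md5(s.encode()).hexdigest(), 16)

    def lookup(self, key, replicas=1):
        """Return the primary server for `key`, plus the next distinct
        servers clockwise on the ring as backup locations."""
        i = bisect.bisect(self.keys, self._h(key)) % len(self.ring)
        out = []
        while len(out) < replicas:
            s = self.ring[i % len(self.ring)][1]
            if s not in out:
                out.append(s)
            i += 1
        return out
```

Because only the keys adjacent to a joining or leaving node move, adding a metadata server does not reshuffle the whole namespace, which is what relieves the single-point bottleneck.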
Improved artificial bee colony algorithm using phased search
LI Guoliang, WEI Zhenhua, XU Lei
Journal of Computer Applications    2015, 35 (4): 1057-1061.   DOI: 10.11772/j.issn.1001-9081.2015.04.1057

Aiming at the shortcomings of the Artificial Bee Colony (ABC) algorithm and its variants in solving high-dimensional complex function optimization problems, such as low solution precision, slow convergence, a tendency to fall into local optima, and too many control parameters in the improved variants, an improved artificial bee colony algorithm using phased search was proposed. In this algorithm, to reduce the probability of falling into local extrema, a phased-search strategy gave the employed bees different characteristics at different stages of the search. An escape radius was defined to guide prematurely converged individuals out of local extrema and avoid blind escape operations. Meanwhile, to improve the quality of the initial food sources, a uniform distribution method and opposition-based learning theory were used. Simulation results on eight typical high-dimensional complex function optimization problems show that the proposed method not only obtains higher solution accuracy but also converges faster, and is especially suitable for high-dimensional optimization problems.
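The opposition-based initialization mentioned above has a simple generic form: for each random solution x in [low, high], its opposite is low + high − x, and the better half of the combined pool seeds the population. This sketch is the textbook version of the idea, not the paper's exact initialization, and it assumes a minimization problem.

```python
import random

def opposition_init(pop_size, dim, low, high, fitness):
    """Opposition-based population initialization: generate uniform
    random solutions, form each one's opposite x' = low + high - x,
    and keep the better half of the combined pool (minimization)."""
    pop = [[random.uniform(low, high) for _ in range(dim)]
           for _ in range(pop_size)]
    opp = [[low + high - x for x in ind] for ind in pop]
    ranked = sorted(pop + opp, key=fitness)
    return ranked[:pop_size]
```

Evaluating both a candidate and its opposite roughly doubles the chance that the initial food sources start near the optimum, which is why it improves initialization quality.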
Fast selection algorithm for intra prediction in AVS2
ZHAO Chao, ZHAO Haiwu, WANG Guozhong, LI Guoping, TENG Guowei
Journal of Computer Applications    2015, 35 (11): 3284-3287.   DOI: 10.11772/j.issn.1001-9081.2015.11.3284
The intra-prediction mode decision process of the second-generation Audio Video coding Standard (AVS2) is computationally complex, and the popularity of ultra-high-definition video puts encoding and decoding systems under great pressure; to address this, a fast intra prediction algorithm was presented. The algorithm first selects only part of the prediction modes for the Smallest Coding Unit (SCU), reducing the computation for the lowest layer, and then derives the prediction mode of each upper-layer Coding Unit (CU) from the modes of the lower-layer CUs, thereby also reducing the computation for the upper layers. The experimental results show that the algorithm's impact on compression efficiency is very small, while encoding time decreases by more than 15% on average, effectively reducing the complexity of intra coding.
Delaunay-based Non-uniform sampling for noisy point cloud
LI Guojun, LI Zongchun, HOU Dongxing
Journal of Computer Applications    2014, 34 (10): 2922-2924.   DOI: 10.11772/j.issn.1001-9081.2014.10.2922

To satisfy the ε-sample condition of Delaunay-based triangulation surface reconstruction algorithms, a Delaunay-based non-uniform sampling algorithm for noisy point clouds was proposed. Firstly, the Medial Axis (MA) of the surface was approximated by the negative poles computed from the Voronoi vertices of the k-nearest neighbors. Secondly, the Local Feature Size (LFS) of the surface was estimated from the approximated medial axis. Finally, combined with the Bound Cocone algorithm, the unwanted interior points were removed. Experiments show that the new algorithm can simplify noisy point clouds accurately and robustly while preserving boundary features well, and that the simplified point clouds are suitable for Delaunay-based triangulation surface reconstruction.
Hotbox level detection of railway vehicle using fuzzy neural networks
CUI Zhuanling, LI Guoning, LIN Sen
Journal of Computer Applications    2013, 33 (09): 2566-2569.   DOI: 10.11772/j.issn.1001-9081.2013.09.2566
Concerning the low accuracy, overly simple algorithm, and numerous but hard-to-adjust parameters of hotbox detection in the infrared Train Hotbox Detecting System (THDS), a new hotbox detection model based on fuzzy neural networks was proposed. The model took three variables as inputs (temperature difference, train temperature difference, and vehicle temperature difference) and four hotbox grades as outputs. One hundred and twenty-five fuzzy rules and a learning algorithm were used to train the fuzzy neural network, which then served as an expert system for detecting hot axles. Practical simulation results show that the fuzzy-neural-network hotbox detection model reduces the number of detection parameters, and its discrete concordance rate reaches 95%.
Real-coded quantum evolutionary algorithm based on cloud model
LI Guozhu
Journal of Computer Applications    2013, 33 (09): 2550-2552.   DOI: 10.11772/j.issn.1001-9081.2013.09.2550
To deal with the quantum evolutionary algorithm's tendency to fall into local optima and its low accuracy, a real-coded quantum evolutionary algorithm based on the cloud model (CRCQEA) was proposed, exploiting the cloud model's combination of randomness and stable tendency. The algorithm used single-dimensional cloud mutation for rapid global search and multi-dimensional cloud evolution to enhance local search ability when exploring the global optimum. Based on the evolutionary progress, the search range was dynamically adjusted and chromosomes were reset, which sped up convergence and prevented the algorithm from falling into local optima. Simulation results show that the algorithm improves search accuracy and efficiency, and is well suited to complex function optimization.
Improved gravitation search algorithm and its application to function optimization
ZHANG Weiping, REN Xuefei, LI Guoqiang, NIU Peifeng
Journal of Computer Applications    2013, 33 (05): 1317-1320.   DOI: 10.3724/SP.J.1087.2013.01317
The Gravitational Search Algorithm (GSA) easily gets trapped in local optima and its optimization precision is poor when applied to function optimization problems. An improved GSA (IGSA) was put forward to solve these problems. By introducing an opposition-based learning strategy, an elite strategy, and a boundary mutation strategy, it significantly improved the exploration and exploitation abilities of GSA and achieved good global and local optimization ability. The proposed IGSA was evaluated on six nonlinear benchmark functions. The experimental results show that, compared with the standard GSA, the weighted GSA (WGSA), and the Artificial Bee Colony (ABC) algorithm, IGSA has much better optimization performance in solving various nonlinear functions.
Bi-level programming model of fast fashion product logistics network under pre-sale strategy
LIU Si-jing, ZHANG Jin, LI Guo-qi
Journal of Computer Applications    2012, 32 (05): 1311-1315.  
To solve the location-allocation problem of a logistics network for fast fashion products under a pre-sale strategy, a bi-level programming model capturing the behaviors of online retailers and customers was proposed to minimize the total cost of serving the logistics network, with the common interests of the online retailer and the customer considered jointly. According to the characteristics of the proposed model, an interactive fuzzy algorithm was adopted to determine the locations, allocations, and service plans of warehouses and third-party logistics enterprises. A case study was conducted to validate the feasibility of the proposed model and algorithm. The results reveal that the central warehouse should be located close to areas of dense demand, and that online retailers should offer a small number of third-party logistics enterprises for customers to choose from.
Equity and real-time traffic signal scheduling algorithm
LI HuiLi, GUO Ai-huang
Journal of Computer Applications    2012, 32 (04): 1161-1164.   DOI: 10.3724/SP.J.1087.2012.01161
Real-time traffic signal scheduling is an important way to relieve traffic congestion, and research on its equity is also vital. In view of the common ground between computer communication networks and transportation networks, and drawing on the ideas of max-min fairness and proportional fairness, a max-min fairness traffic signal scheduling algorithm and a proportional fairness traffic signal scheduling algorithm were proposed. A variety of simulations were conducted to compare their performance with fixed-time control and minimum-queue-length control. The results show that minimum-queue-length control and fixed-time control may not treat every vehicle fairly, because they leave some vehicles waiting for a comparatively long time. Although max-min fairness treats each vehicle fairly, it performs badly when traffic flow density is high. Proportional fairness performs well in both average delay and fairness. The results provide a way to control traffic lights that is both efficient and fair, and thus has good application value.
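The trade-off between throughput and starvation that proportional fairness resolves can be sketched with a toy phase-selection rule. The scoring formula below (queue length weighted by the logarithm of accumulated waiting time) is an illustrative assumption in the spirit of proportional fairness, not the paper's exact algorithm.

```python
import math

def pick_phase(phases, queue_len, wait_time):
    """Choose the next green phase by a proportional-fairness-style
    score: each approach contributes its queue length times
    log(1 + waiting time), so long queues matter but long-waiting
    approaches are never starved. Purely illustrative scoring."""
    def score(approaches):
        return sum(queue_len[a] * math.log1p(wait_time[a])
                   for a in approaches)
    return max(phases, key=lambda p: score(phases[p]))
```

A pure minimum-queue rule would always serve the longest queues; the logarithmic wait term lets a short but long-waiting approach eventually win the phase.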
Nonlinear combinatorial collaborative filtering recommendation algorithm
LI Guo, ZHANG Zhi-bin, LIU Fang-xian, JIANG Bo, YAO Wen-wei
Journal of Computer Applications    2011, 31 (11): 3063-3067.   DOI: 10.3724/SP.J.1087.2011.03063
Collaborative filtering is currently the most popular personalized recommendation technology. However, existing algorithms are limited to the user-item rating matrix, which suffers from sparsity and cold-start problems, and their neighbour similarity considers only the items that users have rated in common, ignoring the correlation between item attributes and user characteristics. In addition, traditional algorithms weight users' interests at different times equally and therefore lack real-time responsiveness. Concerning these problems, a nonlinear combinatorial collaborative filtering algorithm was proposed. To obtain more accurate nearest-neighbour sets, the neighbour similarity calculation was improved based on item attributes and user characteristics respectively. Furthermore, initial predicted ratings were used to fill in the rating matrix, making it much denser. Lastly, a time weight was added to the final predicted rating, so that users' most recent interests carry the largest weight. The experimental results show that the optimized algorithm improves prediction precision by reducing the sparsity and cold-start problems and by realizing real-time recommendation effectively.
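The time-weighting step can be sketched as an exponentially decayed average, so that ratings given long ago contribute less than recent ones. The exponential form and the 30-day half-life below are illustrative assumptions; the paper's actual time-weight function is not specified here.

```python
import math

def time_weighted_rating(ratings, now, half_life=30.0):
    """Combine ratings with an exponential time decay so the most
    recent interests dominate. `ratings` is a list of
    (rating, timestamp_in_days) pairs; `half_life` is assumed."""
    num = den = 0.0
    for r, t in ratings:
        w = math.exp(-math.log(2) * (now - t) / half_life)
        num += w * r
        den += w
    return num / den if den else 0.0
```

With a 30-day half-life, a rating given 60 days ago carries a quarter of the weight of one given today, pulling the prediction toward the user's latest interests.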
Fast block matching algorithm based on subblock mean
LI Guo, SUN Xinghua
Journal of Computer Applications   
The integral image, as an intermediate image representation, can be used to quickly calculate the sum of gray levels within any rectangle. On this basis, a novel partial matching error function based on sub-block means was presented. Experiments show that this matching error function is superior to both the full matching error function and the sub-sampling-based matching error function in terms of motion estimation quality and speed. With a constant sub-block division, the sub-block-mean matching error function takes almost the same time for different matching images, which makes it well suited for real-time applications such as video compression.
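The integral-image trick the abstract relies on can be sketched directly: after one cumulative-sum pass, the sum of any rectangle (and hence any sub-block mean) comes from just four table lookups. This is the standard summed-area-table construction, shown here as a minimal NumPy sketch.

```python
import numpy as np

def integral_image(img):
    """Summed-area table: ii[y, x] holds the sum of img[:y+1, :x+1]."""
    return np.cumsum(np.cumsum(img, axis=0), axis=1)

def rect_sum(ii, y0, x0, y1, x1):
    """Sum of img[y0:y1, x0:x1] in O(1) via the four-corner identity."""
    total = ii[y1 - 1, x1 - 1]
    if y0 > 0:
        total -= ii[y0 - 1, x1 - 1]
    if x0 > 0:
        total -= ii[y1 - 1, x0 - 1]
    if y0 > 0 and x0 > 0:
        total += ii[y0 - 1, x0 - 1]
    return total
```

Dividing `rect_sum` by the block area gives the sub-block mean, so the partial matching error can be evaluated at constant cost per block regardless of block size.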
XML metadata retrieval based on an approximate matching model
OUYANG Liu-bo,LI Xue-yong,YANG Guang-zhong,LI Guo-hui
Journal of Computer Applications    2005, 25 (04): 820-823.   DOI: 10.3724/SP.J.1087.2005.0820
This paper decomposed unordered labeled tree matching into tree structure matching and tree label semantic matching. By combining the two, it turned traditional exact tree matching algorithms into approximate matching, and a metadata retrieval method based on a three-level tree approximate matching model was put forward. With this retrieval method, precision and recall can be adjusted to the differing requirements of users. Finally, the paper presented the retrieval process for XML-oriented metadata and the application design of approximate metadata matching. Experimental results prove that the approximate matching model is feasible and efficient for XML metadata retrieval.